
Deconstructing Pi


How long would it take a single computer to render all of the visual effects shots in Life of Pi, the Oscar-winning 2012 movie? About 1,633 years, believe it or not. That was just one of the revelations at the Academy event “Deconstructing Pi” on Monday, May 6, 2013, at the Samuel Goldwyn Theater, where a sold-out crowd found out how cinematic wizards brought to life the unforgettable adventure of a boy adrift on the ocean with a Bengal tiger named Richard Parker.

John Bailey, an Academy governor representing the Cinematographers Branch, welcomed the audience after an introductory video showing the previsualization, or “previs,” for the film’s “God Storm” sequence in which Pi is hurled into a turbulent sea and submits completely to a higher power.

This previs version was generated with computers to lay out the visual paths of each shot, indicating how the stereoscopic flow and compositions would work in the final cut (which was also shown). Bailey noted how the use of stereoscopic photography has evolved considerably since he first experienced it in a theater screening of the Vincent Price film “House of Wax” (1953).

The host for the evening, Academy governor Bill Kroyer from the Short Films and Feature Animation Branch, showed the evolution of the 3D format through still images and a few props – including a Victorian-era stereoscope and a vintage View-Master, much to the audience’s delight. He also invited attendees to check out a display in the lobby containing artifacts from the production of the film, including precision-mapped, stereoscopically paired lenses, a camera-agnostic 3D sled and one of the camera systems used to capture the film’s many underwater shots.

The amount of detail and precision required in cinematic 3D today is truly at the cutting edge of technology, and the evening’s guests bore that out as they each described their contributions to the film. “It’s gonna be really hard!” was the initial reaction of the first speaker, Oscar-nominated editor Tim Squyres, who has edited all but one of director Ang Lee’s films since The Wedding Banquet (1993).

Life of Pi required a tiger that looked absolutely real and blended seamlessly with the live animal used on the set, Squyres recalled. Furthermore, he felt that it was important for the film’s visual beauty to complement the emotional and spiritual aspects of the story, which originated as a bestselling 2001 novel by Yann Martel. Squyres explained that the film’s aesthetic had to be achieved with a delicate use of stereoscopic photography that wouldn’t strain viewers’ eyes. He went on to demonstrate format principles such as “IO” (shorthand for interocular distance), the distance between the human eyes that causes slight perspective differences in left-eye/right-eye images of an object, enabling depth perception; and convergence, the point at which the left-eye and right-eye images seem to come together, locating the object in space.
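
These two parameters reduce to simple geometry. The sketch below is a minimal illustration, not anything shown at the event: it relates a hypothetical camera interaxial separation and convergence distance to on-screen parallax for objects at different depths. Every number and the function name are illustrative assumptions.

```python
# Illustrative stereo-geometry sketch (not material from the panel): how an
# interaxial separation and a convergence distance translate into on-screen
# parallax. Assumes an idealized parallel-camera rig re-converged by shifting
# the two images horizontally; every value below is a made-up example.

def screen_parallax_mm(object_dist_m, convergence_dist_m, interaxial_mm,
                       focal_mm, sensor_width_mm, screen_width_mm):
    """On-screen horizontal parallax (mm) for a point at object_dist_m.

    > 0 : uncrossed disparity, the point reads as behind the screen
    = 0 : the point sits at the convergence plane (the screen itself)
    < 0 : crossed disparity, the point reads as in front of the screen
    """
    # Disparity on the sensor for parallel cameras, after shifting the images
    # to converge at convergence_dist_m; distances in metres, result in mm.
    sensor_disparity_mm = focal_mm * interaxial_mm * (
        1.0 / convergence_dist_m - 1.0 / object_dist_m) / 1000.0
    # Magnify sensor disparity up to the projection screen.
    return sensor_disparity_mm * (screen_width_mm / sensor_width_mm)

if __name__ == "__main__":
    # Hypothetical setup: 20 mm interaxial, 35 mm lens, ~25 mm wide sensor,
    # convergence 4 m from camera, shown on a 10 m wide theatre screen.
    for z in (2.0, 4.0, 8.0, 100.0):
        p = screen_parallax_mm(z, 4.0, 20.0, 35.0, 24.9, 10_000.0)
        print(f"object at {z:6.1f} m -> screen parallax {p:+7.1f} mm")
```

The sign of the result flips exactly at the convergence distance, which is the “locating the object in space” behavior Squyres described: objects nearer than the convergence plane appear in front of the screen, objects farther away appear behind it.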

Speaking next was Brad Alexander, a partner and senior previsualization supervisor at HALON Entertainment. Some of the rare material he unveiled included early conceptual 3D shot designs for the film, which were done in the anaglyph (red and blue) stereoscopic format because sophisticated 3D monitors weren’t readily available. He also showed how elements like the ocean environment, the figure of Pi himself, and important props like the lifeboat and raft were created via computer in rough form for what could be considered digital storyboards. The HALON team created a “stereo script” for the film, determining whether each shot would work in both 3D and 2D projection. One example included the film’s spectacular shipwreck scene, which was shown both in previs and final forms (with Pi suspended underwater, watching, as the ship plunges into the deep). From start to finish, the previs process for the film took an astonishing two years and four months.
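
For readers curious what the red-and-blue previs format amounts to in practice, here is a minimal sketch of an anaglyph composite, assuming left- and right-eye frames saved as ordinary image files. The file names and the use of NumPy/Pillow are assumptions for the example, not a description of HALON’s actual pipeline.

```python
# Minimal red/cyan anaglyph composite: take the red channel from the left-eye
# frame and the green/blue channels from the right-eye frame, so the pair can
# be judged on an ordinary monitor with paper glasses. File names are
# hypothetical placeholders.
import numpy as np
from PIL import Image

def make_anaglyph(left_path: str, right_path: str, out_path: str) -> None:
    left = np.asarray(Image.open(left_path).convert("RGB"), dtype=np.uint8)
    right = np.asarray(Image.open(right_path).convert("RGB"), dtype=np.uint8)
    if left.shape != right.shape:
        raise ValueError("left/right frames must match in size")
    anaglyph = right.copy()
    anaglyph[..., 0] = left[..., 0]   # red channel comes from the left eye
    # green and blue channels stay from the right eye
    Image.fromarray(anaglyph).save(out_path)

if __name__ == "__main__":
    make_anaglyph("shot010_left.png", "shot010_right.png",
                  "shot010_anaglyph.png")
```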

A different perspective on Life of Pi was given by cinematographer Claudio Miranda, who has shot such films as The Curious Case of Benjamin Button (2008), Tron: Legacy (2010) and Oblivion (2013), and who won his first Oscar for Life of Pi. He had to be especially concerned with the presentation of light in the film, equipping the water-tank set with different setups to accommodate differing grades of sunlight ranging from sunrise to blazing high noon to sunset. He also had to keep in mind the pacing of 3D moments, with shots rated from 1 to 5, 5 being the most aggressive (and used sparingly to avoid wearing out audiences’ eyes).

One highlight was Miranda explaining how an early scene during Pi’s opening narration about his parents was achieved using natural candlelight – which in this case meant 20,000 candles floating in the water. Miranda’s fondness for low light was easy to accommodate since the film was shot digitally and didn’t require the high-powered overhead lighting such a scene would otherwise demand, resulting in beautifully layered compositions that could never have been achieved in 3D just a few years earlier.

Special effects technician and coordinator Donald R. Elliott, who won an Oscar for Life of Pi and worked on such films as Jurassic Park (1993) and Pirates of the Caribbean: Dead Man’s Chest (2006), also explored the mixture of the physical and the digital when he unveiled images of the huge water tank constructed for the film’s production in Taiwan. To create natural-looking ocean waves rather than artificial, bathtub-like splashing, the crew used wave generators to push all of the water in one direction, producing a flow that would look convincing on camera. They also constructed multiple versions of Pi’s lifeboat, including one with a pair of cages covering half of the structure – one for the real tiger playing Richard Parker in some shots and another for the tiger’s trainer.

Oscar-winning visual effects supervisor Bill Westenhofer, who delivered the acceptance speech for the film’s Visual Effects team at the 85th Academy Awards earlier in the year, confessed he wasn’t a big fan of 3D going into the film. He had won an earlier Academy Award for The Golden Compass (2007) and been nominated for The Chronicles of Narnia: The Lion, the Witch and the Wardrobe (2005), but this was the project that finally converted him to the format’s possibilities. He walked through the ways 3D requires increased precision and extra prep work, including pinpointing established 2D cinema tools that wouldn’t work in 3D, such as matte paintings and traditional element shots like water splashing close to the camera. He also played some useful production footage of “convergence maps,” red-and-blue overlays on elements of a shot that let the crew spot any alignment issues or imperfections in the 3D compositions. This enabled them to be extremely specific when they addressed challenges like blending the tank water with CGI oceans, which could also be digitally graded with additional color timing and lighting to create the illusion of a larger environment and convey the desired mood for director Ang Lee. “I look at what we pulled off,” he mused, “and I’m not quite sure how we did it.”
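
The production’s own tools weren’t shown in detail, but the idea behind such overlay checks can be approximated with a simple block-matching pass over the two eyes: measure the horizontal offset (the parallax) per tile and flag any vertical offset, a classic source of 3D eye strain. The tile size, search range, thresholds and file names below are all assumptions made for this sketch, not values from the film.

```python
# Rough "convergence map"-style diagnostic (an approximation, not the
# production tool): brute-force block matching between left and right frames
# reports per-tile horizontal disparity and flags vertical misalignment.
import numpy as np
from PIL import Image

def load_gray(path: str) -> np.ndarray:
    """Load an image as a float32 grayscale array."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32)

def tile_disparities(left: np.ndarray, right: np.ndarray,
                     tile: int = 64, search: int = 16):
    """Yield (y, x, dx, dy): best offset of the right eye relative to the
    left, per tile, via sum-of-absolute-differences matching."""
    h, w = left.shape
    for y in range(0, h - tile, tile):
        for x in range(search, w - tile - search, tile):
            ref = left[y:y + tile, x:x + tile]
            best = (float("inf"), 0, 0)
            for dy in (-2, -1, 0, 1, 2):            # small vertical search
                if y + dy < 0 or y + dy + tile > h:
                    continue
                for dx in range(-search, search + 1):
                    cand = right[y + dy:y + dy + tile, x + dx:x + dx + tile]
                    score = float(np.abs(ref - cand).mean())
                    if score < best[0]:
                        best = (score, dx, dy)
            yield y, x, best[1], best[2]

if __name__ == "__main__":
    left = load_gray("shot010_left.png")            # hypothetical file names
    right = load_gray("shot010_right.png")
    for y, x, dx, dy in tile_disparities(left, right):
        if dy != 0:
            print(f"tile ({y},{x}): vertical offset {dy:+d}px -- alignment issue")
        elif abs(dx) > 20:                          # made-up parallax budget (px)
            print(f"tile ({y},{x}): {dx:+d}px parallax -- check the depth budget")
```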

Westenhofer also addressed a topic many audience members were looking forward to: the creation of the digital Richard Parker, a photorealistic tiger unlike any CG-created animal to date. It took over a year for the effects team to build the virtual Bengal tiger, which featured over 10 million hairs and could take up to 30 hours to render for a single frame of footage. His earlier experience also came in handy when the team used the digital Aslan character from “Narnia” as a test to see how it would look in 3D, proving that a big cat could indeed come to life in the format. The finished film featured 690 effects shots out of a total of 960 shots, with up to 260 TB of hard disk space used at its peak.
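
Those figures invite a back-of-envelope check against the 1,633-year single-computer estimate quoted at the top of the evening. The arithmetic below is purely illustrative: the per-shot frame count and the number of render passes are assumptions, not production numbers, chosen only to show how per-frame render times on that order compound.

```python
# Back-of-envelope check (illustrative only; shot lengths and pass counts are
# assumptions, not production figures): how per-frame render times on the
# order of "up to 30 hours" compound into a multi-century single-computer load.

HOURS_PER_YEAR = 24 * 365.25

single_machine_years = 1_633
total_machine_hours = single_machine_years * HOURS_PER_YEAR
print(f"total render load ~ {total_machine_hours:,.0f} machine-hours")

# Hypothetical breakdown: 690 effects shots, an assumed ~130 frames per shot,
# two eyes per frame for stereo, and several render passes / iterations each.
shots, frames_per_shot, eyes, passes = 690, 130, 2, 8
frames_rendered = shots * frames_per_shot * eyes * passes
print(f"assumed frames rendered across all passes: {frames_rendered:,}")

hours_per_frame = total_machine_hours / frames_rendered
print(f"implied average ~ {hours_per_frame:.1f} hours per rendered frame,")
print("consistent with peaks of up to 30 hours for the heaviest tiger frames")
```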

Some more animal creation insights were offered by the final guest of the evening, Oscar-winning character animator Erik-Jan De Boer. Using fascinating test footage, he covered some of the 560 different animals created for the film, including (in addition to Richard Parker himself) a zebra, a hyena, an orangutan and, of course, an entire island occupied by meerkats. Some of the animation exercises were rotoscoped from (or drawn over) footage of real tigers, coming shockingly close to the real thing but with enhanced lighting and detailing to show off the animal’s muscles and skin. The sense of the animal’s weight was conveyed through such touches as showing its paws contacting the ground and the toes and claws flexing, while the entire tiger had to be built in stages from a skeleton, adding layers one at a time: muscles, subcutaneous tissue, skin, and finally fur guides. Most amusingly, he also demonstrated via photos how a blue stuffed animal was used on set to stand in for the tiger with actor Suraj Sharma (who couldn’t actually swim when he started the film).

The entire panel was reunited onstage for a brief final wrap-up, with Squyres getting the biggest gasp when he explained that editing the film required him to wear 3D glasses almost all day for two years. How did he cope? “A lot of Advil!” He also echoed a remark Westenhofer made at the start of production, which served as a perfect summary of the whole ambitious adventure: “We’ll know how to make this movie when we finish it.”